
[CAI-749] Local parser #2007

Open
anemone008 wants to merge 38 commits into main from CAI-749-parser-url-crawler

Conversation

@anemone008
Collaborator

List of Changes

Adds a parsing script to the parser app. Parsed content is saved locally; URLs are sanitized for the filesystem and used as file names.

Motivation and Context

How Has This Been Tested?

Tested for errors associated with non-existent or unreachable URLs. Reproducible via `npm test`, as described in the README.md.

Screenshots (if appropriate):

Types of changes

  • Chore (nothing changes from a user's perspective)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

Checklist:

  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.

@changeset-bot

changeset-bot bot commented Feb 9, 2026

🦋 Changeset detected

Latest commit: 2a1b9a0

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package
| Name | Type |
| ------ | ----- |
| parser | Major |


@anemone008 anemone008 self-assigned this Feb 9, 2026
@anemone008 anemone008 requested a review from Copilot February 9, 2026 16:45
Copilot AI (Contributor) left a comment

Pull request overview

Adds a local “parser” CLI to crawl a site, extract page metadata, and persist results to disk with URL/filename sanitization.

Changes:

  • Introduces a Puppeteer-based crawler/metadata extractor and local JSON output.
  • Adds URL normalization + filesystem-safe filename sanitization utilities.
  • Adds build/test tooling (TypeScript + Jest) and an error-handling integration test.

Reviewed changes

Copilot reviewed 16 out of 18 changed files in this pull request and generated 10 comments.

Summary per file:

| File | Description |
| --- | --- |
| apps/parser/src/parser.ts | Adds the CLI entrypoint: reachability check, crawl orchestration, page parsing, persistence. |
| apps/parser/src/modules/crawler.ts | Implements recursive crawl, scope filtering, and link discovery. |
| apps/parser/src/modules/domActions.ts | Expands interactive UI sections before scraping text/links. |
| apps/parser/src/modules/config.ts | Resolves env-based configuration and output directory derivation. |
| apps/parser/src/modules/output.ts | Creates the output directory and writes JSON snapshots. |
| apps/parser/src/modules/errors.ts | Centralizes fatal error handling/exit code. |
| apps/parser/src/modules/types.ts | Adds typed metadata/node structures for crawl results. |
| apps/parser/src/utils/url.ts | Adds URL normalization helpers and remote URL detection. |
| apps/parser/src/utils/sanitizeFilename.ts | Adds filesystem-safe filename sanitization. |
| apps/parser/tests/parser.error-handling.test.ts | Adds an integration test for unreachable/nonexistent URL behavior. |
| apps/parser/package.json | Adds build/parse/test scripts and required dependencies. |
| apps/parser/jest.config.ts | Configures Jest + ts-jest for the parser app. |
| apps/parser/tsconfig.json | Adds the parser app TS config for dev/test typechecking. |
| apps/parser/tsconfig.build.json | Adds the build TS config emitting to dist/. |
| apps/parser/README.md | Documents CLI usage, env vars, and tests. |
| .changeset/wide-hairs-fail.md | Changeset entry for the new parser feature. |


@anemone008 anemone008 marked this pull request as draft February 9, 2026 16:59
anemone008 and others added 2 commits February 10, 2026 12:04
@anemone008 anemone008 marked this pull request as ready for review February 10, 2026 13:56
Comment on lines 14 to 15
```
if (!input) {
return DEFAULT_REPLACEMENT;
```
(Collaborator) I suggest adding a warning log, so that we can tell when something went wrong and DEFAULT_REPLACEMENT is used.
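The reviewer's suggestion could look like the following sketch. The sanitization rule and the `DEFAULT_REPLACEMENT` value are assumptions; only the names come from the PR.

```typescript
// Hypothetical sketch: log a warning before falling back to
// DEFAULT_REPLACEMENT, so silent failures become visible in the logs.
const DEFAULT_REPLACEMENT = "_";

function sanitizeUrlAsFilename(input: string): string {
  if (!input) {
    console.warn("sanitizeUrlAsFilename: empty input, using DEFAULT_REPLACEMENT");
    return DEFAULT_REPLACEMENT;
  }
  // Replace characters that are unsafe on common filesystems (assumed rule).
  return input.replace(/[^a-zA-Z0-9._-]/g, "_");
}
```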

```
return candidate;
}

export const UrlWithoutAnchors = (rawUrl: string): string => {
```
(Collaborator) I suggest a name like RemoveAnchorsFromUrl; it's easier to understand what this does.

```
try {
page = await browser.newPage();
await page.goto(node.url, {
waitUntil: "networkidle2",
```
(Collaborator) Can we use a constant for this?
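Extracting the magic string into a named constant might look like this. The constant name is an assumption, and the union type below mirrors Puppeteer's `waitUntil` options rather than importing them.

```typescript
// Sketch: define the wait strategy once instead of repeating the literal.
// This type mirrors Puppeteer's PuppeteerLifeCycleEvent.
type PuppeteerLifeCycleEvent =
  | "load"
  | "domcontentloaded"
  | "networkidle0"
  | "networkidle2";

const PAGE_LOAD_WAIT_UNTIL: PuppeteerLifeCycleEvent = "networkidle2";

// Usage inside the crawler (assumed shape):
// await page.goto(node.url, { waitUntil: PAGE_LOAD_WAIT_UNTIL });
```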

```
url.hash = "";
return UrlWithoutAnchors(url.toString());
} catch (_error) {
return rawUrl;
```
(Collaborator) I suggest adding a warning here.
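Taken together with the naming suggestion above, the helper might end up like this simplified sketch; the body is assumed from the quoted snippet.

```typescript
// Hedged sketch combining two review suggestions: a more descriptive name
// than UrlWithoutAnchors, and a warning when URL parsing fails instead of
// silently returning the raw input.
const removeAnchorsFromUrl = (rawUrl: string): string => {
  try {
    const url = new URL(rawUrl);
    url.hash = ""; // strip the #fragment
    return url.toString();
  } catch {
    console.warn(`removeAnchorsFromUrl: could not parse "${rawUrl}", returning it unchanged`);
    return rawUrl;
  }
};
```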

```
}
}

async function persistSnapshot(snapshot: ParsedMetadata, FILENAME_LENGTH_THRESHOLD: number): Promise<void> {
```
(Collaborator) Isn't FILENAME_LENGTH_THRESHOLD a constant reachable inside this function? Why do you need to pass it as a parameter?
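The reviewer's point, sketched below: a module-level constant is visible inside every function in the file, so it need not be threaded through parameters. `ParsedMetadata`'s shape and the filename logic here are assumptions.

```typescript
// Module-level constant, read directly instead of received as an argument.
const FILENAME_LENGTH_THRESHOLD = 250;

// Assumed minimal shape of the PR's ParsedMetadata type.
interface ParsedMetadata {
  url: string;
  title: string;
}

// Derives the output filename without taking the threshold as a parameter.
function snapshotFilename(snapshot: ParsedMetadata): string {
  return snapshot.url.slice(0, FILENAME_LENGTH_THRESHOLD) + ".json";
}
```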

```
0,
env.maxDepth,
parsedPages,
parsePageFn,
```
(Collaborator) Why do you have to pass the function here? Can't we export the function and call it inside parsePages?
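A minimal sketch contrasting the two styles; all names are synchronous, hypothetical stand-ins for the PR's async crawl/parse functions.

```typescript
// Style used in the PR: the parse function travels down as a parameter.
function crawlInjected(urls: string[], parsePageFn: (url: string) => string): string[] {
  return urls.map(parsePageFn);
}

// Reviewer's suggestion: export the parse function from its module and
// call it directly, shortening the recursion's parameter list.
function parsePage(url: string): string {
  return `parsed:${url}`; // stand-in for real page parsing
}

function crawlDirect(urls: string[]): string[] {
  return urls.map(parsePage);
}
```

The direct call is simpler, while injection keeps the crawler testable with a stub parser; which trade-off is right depends on how the tests in this PR exercise the recursion.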

@MarBert MarBert (Collaborator) left a comment

Add a log with the text "Completed parsing of page [...]. Found [...] links. [nr links parsed/nr links to parse]" to monitor the program's progress;
Remove the trailing hyphen from the .json file names;
Handle the domain problem with a split at the first slash after https:// and check that it equals the baseUrl or {valid_variant}.domain; add a TODO for a more general solution;
Lower FILENAME_LENGTH_THRESHOLD to 250, so that there is room for the extension in the filename;
Make the name of the folder where the JSON files are saved look like www.uqido.com, without the leading http...;
Name the homepage file _.json, then check with Ciri whether specific names are needed;
Rename the parsePages function to exploreAndParsePages;
Rename the parser.ts file to main.ts and move all functions, except the launch of the recursion, into the helpers;
Rename crawler.ts to parser.ts and move the parsePageFn function there, renamed generatePageParsedMetadata;
Check which parameters do not need to be passed to the recursion and export them as a new recursionMetadata object;
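The suggested domain check could be sketched as follows: take the host part (everything up to the first slash after the scheme) and accept it only if it equals the base host or `<variant>.<host>` for an allowed variant. All names are assumptions, and the TODO marks the more general solution the review asks for.

```typescript
// Hedged sketch of the reviewer's domain-scope check.
function isInScope(rawUrl: string, baseHost: string, validVariants: string[]): boolean {
  // TODO: replace with a more general solution (e.g. full URL parsing).
  const withoutScheme = rawUrl.replace(/^https?:\/\//, "");
  const host = withoutScheme.split("/")[0];
  if (host === baseHost) {
    return true;
  }
  return validVariants.some((variant) => host === `${variant}.${baseHost}`);
}
```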

- Update sanitizeUrlAsFilename to support length threshold and hash suffix for long URLs.
- Modify resolveEnv to parse validDomainVariants from environment variables.
- Refactor parsePages to utilize validDomainVariants for scope checking.
- Update EnvConfig type to include validDomainVariants.
- Improve tests for sanitizeUrlAsFilename to cover new functionality.
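The length-threshold-plus-hash-suffix behavior described in the first commit above could look like this sketch; the hash algorithm and suffix length are assumptions, not taken from the PR.

```typescript
import { createHash } from "node:crypto";

// Hedged sketch: when the sanitized name exceeds the threshold, truncate it
// and append a short hash so long URLs still map to unique, safe filenames.
function sanitizeUrlAsFilename(url: string, threshold: number): string {
  const sanitized = url.replace(/[^a-zA-Z0-9._-]/g, "_");
  if (sanitized.length <= threshold) {
    return sanitized;
  }
  const hash = createHash("sha256").update(url).digest("hex").slice(0, 8);
  // Leave room for the "-<hash>" suffix within the threshold.
  return `${sanitized.slice(0, threshold - hash.length - 1)}-${hash}`;
}
```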
…me function so that for root it returns the hostname
…,remove constants from recursion parameters and enhance helpers
@github-actions
Contributor

Branch is not up to date with base branch

@anemone008 it seems this pull request is not up to date with the base branch.
Please merge or rebase to resolve this.

@github-actions
Contributor

github-actions bot commented Feb 13, 2026

Jira Pull Request Link

This Pull Request refers to the following Jira issue CAI-749

@github-actions
Contributor

This PR exceeds the recommended size of 800 lines. Please make sure you are NOT addressing multiple issues with one PR. Note this PR might be rejected due to its size.

@MarBert MarBert self-requested a review February 13, 2026 16:15
